

Learning Attractor Dynamics for Generative Memory

Wu, Yan; Wayne, Gregory; Gregor, Karol; Lillicrap, Timothy

Neural Information Processing Systems

A central challenge faced by memory systems is the robust retrieval of a stored pattern in the presence of interference due to other stored patterns and noise. A theoretically well-founded solution to robust retrieval is given by attractor dynamics, which iteratively clean up patterns during recall. However, incorporating attractor dynamics into modern deep learning systems poses difficulties: attractor basins are characterised by vanishing gradients, which are known to make training neural networks difficult. In this work, we exploit recent advances in variational inference and avoid the vanishing gradient problem by training a generative distributed memory with a variational lower-bound-based Lyapunov function. The model is minimalistic, with surprisingly few parameters. Experiments show that it converges to correct patterns upon iterative retrieval and achieves competitive performance as both a memory model and a generative model.
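The classical discrete Hopfield network is the simplest concrete instance of the idea the abstract describes: stored patterns are fixed points of the update dynamics, and the network energy is a Lyapunov function that never increases during recall. The following is a minimal NumPy sketch for intuition only; it is not the paper's model, which replaces the energy with a variational lower bound.

import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 3                                  # neurons, stored patterns
patterns = rng.choice([-1.0, 1.0], size=(P, N))
W = patterns.T @ patterns / N                  # Hebbian outer-product weights
np.fill_diagonal(W, 0.0)                       # no self-connections

def energy(s):
    # Hopfield energy: non-increasing under asynchronous updates,
    # i.e. a Lyapunov function for the retrieval dynamics
    return -0.5 * s @ W @ s

# Corrupt a stored pattern, then let the dynamics clean it up.
s = patterns[0].copy()
flipped = rng.choice(N, size=15, replace=False)
s[flipped] *= -1.0

for sweep in range(5):
    for i in rng.permutation(N):               # asynchronous unit updates
        s[i] = 1.0 if W[i] @ s >= 0.0 else -1.0

print(np.mean(s == patterns[0]))               # ~1.0: converged to the attractor

The sketch also makes the abstract's training difficulty visible: inside an attractor basin the unrolled retrieval dynamics are insensitive to small input perturbations, so gradients backpropagated through the recall steps vanish.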



Reviews: Learning Attractor Dynamics for Generative Memory

Neural Information Processing Systems

This paper proposes a generative model which builds on ideas from dynamical systems and previous deep learning work such as the Kanerva Machine. The main idea is to design and train an architecture that, when unrolled as a dynamical system, has points from the target distribution as attractors. I found the presentation of the model reasonably clear, but thought it suffered from excessive formality. For example, the description of p(M) could just say that each row of M is a Gaussian with its own mean and scaled-identity covariance. The references to matrix-variate Gaussians, Kronecker products, vectorization operators, etc. don't contribute to clarity.
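To make the reviewer's point concrete: with a diagonal row covariance, a matrix-variate Gaussian with Kronecker-structured covariance is just K independent isotropic Gaussian rows. A small NumPy sketch of the equivalence, where R and v are assumed names for the row means and per-row variances, and row-major flattening is used so the Kronecker factor sits on the left:

import numpy as np

rng = np.random.default_rng(0)
K, C = 4, 3                     # memory rows (slots) and columns (code size)
R = rng.normal(size=(K, C))     # assumed name: per-row means
v = rng.uniform(0.5, 2.0, K)    # assumed name: per-row variances

# Row-wise view: row k ~ N(R[k], v[k] * I_C)
M_rows = R + np.sqrt(v)[:, None] * rng.normal(size=(K, C))

# Matrix-variate view: flatten row-major, so cov = diag(v) (x) I_C is
# block-diagonal with one C x C block per row
cov = np.kron(np.diag(v), np.eye(C))
M_kron = rng.multivariate_normal(R.reshape(-1), cov).reshape(K, C)

# Both constructions define the same distribution over M; the Kronecker
# and vectorisation machinery is bookkeeping, not extra modelling power.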

